3-Computer Science-Systems-Computer Vision-Algorithms-Features

feature detection methods

Feature detection can use point matching based on features or boundaries {feature detection methods}: corner detection, the scale-invariant feature transform (SIFT), and speeded-up robust features (SURF). Features are good image-category descriptors.
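
A minimal sketch of two of these detectors, assuming OpenCV (the cv2 module) and a placeholder filename image.png; SURF is patented and may be absent from standard OpenCV builds, so it is omitted.

import cv2

# Load a grayscale image (image.png is a placeholder filename).
img = cv2.imread("image.png", cv2.IMREAD_GRAYSCALE)

# Harris corner detection: neighborhood size, Sobel aperture, Harris k.
corners = cv2.cornerHarris(img.astype("float32"), 2, 3, 0.04)

# SIFT keypoints and 128-dimensional descriptors (in OpenCV 4.4 and later).
sift = cv2.SIFT_create()
keypoints, descriptors = sift.detectAndCompute(img, None)
print(len(keypoints), descriptors.shape)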

feature detection algorithms

Algorithms {feature detection algorithms} can detect image features, such as corners, edges, and blobs.

descriptor

Properties {X-variable, vision} {descriptor, vision} can describe features and have components.

canonical factor analysis

Factor analysis has a basic method {canonical factor analysis} that extracts factors maximally correlated with the observed variables.

centroid method

Factor analysis can use centroids {centroid method}, approximating factor loadings from sums over variables rather than from exact eigenvalue extraction.

Correlation Analysis

Properties and features have relationships {Correlation Analysis}, measured by correlation coefficients.

correspondence factor analysis

Factor-analysis methods {correspondence factor analysis} can use variable frequencies relative to properties, compute chi-square values, and find principal components.
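
A NumPy sketch of these steps on a small placeholder frequency table: chi-square-style standardized residuals are computed and then factored into principal components by singular-value decomposition.

import numpy as np

N = np.array([[20.0, 5.0, 10.0],   # placeholder frequency table:
              [8.0, 12.0, 7.0],    # rows are variables,
              [4.0, 9.0, 25.0]])   # columns are properties

P = N / N.sum()                    # relative frequencies
r = P.sum(axis=1, keepdims=True)   # row masses
c = P.sum(axis=0, keepdims=True)   # column masses

# Chi-square-style residuals: (observed - expected) / sqrt(expected).
S = (P - r @ c) / np.sqrt(r @ c)

U, s, Vt = np.linalg.svd(S)        # principal components of the residuals
print(s ** 2)                      # principal inertias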

disjoint principal component

Principal components can be independent {disjoint principal component}, with a separate principal-component model for each object class.

eigenvalue-one criterion

Thresholds {eigenvalue-one criterion} can retain only components whose eigenvalues exceed one (the Kaiser criterion).
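
A NumPy sketch of the criterion on placeholder data: count correlation-matrix eigenvalues greater than one.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))      # placeholder data: 100 samples, 8 variables

R = np.corrcoef(X, rowvar=False)   # 8 x 8 correlation matrix
eigenvalues = np.linalg.eigvalsh(R)

# Kaiser (eigenvalue-one) criterion: keep components with eigenvalue > 1.
print(int(np.sum(eigenvalues > 1.0)))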

eigenvector projection

Unsupervised linear methods {eigenvector projection} can find factors by projecting data onto correlation-matrix eigenvectors.

Evolutionary Programming

Models {Evolutionary Programming} can add and subtract randomly selected variables, with crossing-over, and evaluate for "fitness" or best fit. Extinctions can be more or less frequent and can affect more or fewer species; more-frequent extinctions affect fewer species. Extinction sizes follow power laws, because single events can cause few or many extinctions.
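
A toy Python sketch of this evolutionary variable selection, following the description above (random addition and removal of variables, crossing-over, and fitness evaluation); the fitness function, rates, and population size are illustrative assumptions.

import random

N_VARS, POP, GENERATIONS = 10, 20, 50
target = {1, 3, 7}                          # hypothetical "true" variable subset

def fitness(model):                         # toy fitness: reward overlap with the
    return len(model & target) - 0.1 * len(model)   # target, penalize model size

population = [set(random.sample(range(N_VARS), 3)) for _ in range(POP)]
for _ in range(GENERATIONS):
    population.sort(key=fitness, reverse=True)
    survivors = population[:POP // 2]       # selection: keep the fitter half
    children = []
    for _ in range(POP - len(survivors)):
        a, b = random.sample(survivors, 2)
        child = {v for v in a | b if random.random() < 0.5}   # crossing-over
        if random.random() < 0.3:           # mutation: add or subtract a variable
            child ^= {random.randrange(N_VARS)}
        children.append(child)
    population = survivors + children

print(max(population, key=fitness))         # best-fit variable subset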

evolving factor analysis

Methods {evolving factor analysis} can analyze ordered data, tracking how factors appear and disappear as successive rows are added.

explained variance percentage

Methods {explained variance percentage} can indicate the number of components required to reach 90% of total variance.
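
A NumPy sketch counting the components needed to reach 90% of total variance, assuming centered placeholder data.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(50, 6))
X -= X.mean(axis=0)                      # center the columns

s = np.linalg.svd(X, compute_uv=False)   # singular values
variance = s ** 2 / (s ** 2).sum()       # variance fraction per component

# Smallest number of components whose cumulative variance reaches 90%.
n = int(np.searchsorted(np.cumsum(variance), 0.90)) + 1
print(n)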

factorial design

Designs {factorial design} can try to ensure design-space sampling by testing factor levels in all combinations, so coverage holds even as any one factor varies.

Genetic Function Algorithm

Linear-model populations {Genetic Function Algorithm} can have different variable values, exchange values by crossing-over between related genes, and undergo random changes, with selection for best fit.

latent variable

Variables {latent variable} can be linear combinations of descriptors.

linear discriminant analysis

Supervised methods {linear discriminant analysis}, in which the boundary surface minimizes within-region variance and maximizes between-region variance, can put compounds into groups by activity level.
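
A scikit-learn sketch of LDA putting compounds into two activity-level groups; the data are random placeholders, and scikit-learn availability is assumed.

import numpy as np
from sklearn.discriminant_analysis import LinearDiscriminantAnalysis

rng = np.random.default_rng(0)
X = np.vstack([rng.normal(0, 1, (30, 4)),   # low-activity compounds
               rng.normal(2, 1, (30, 4))])  # high-activity compounds
y = np.array([0] * 30 + [1] * 30)           # activity-level labels

# The LDA boundary minimizes within-group and maximizes between-group variance.
lda = LinearDiscriminantAnalysis().fit(X, y)
print(lda.score(X, y))                      # classification accuracy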

linear learning machine

Supervised methods {linear learning machine} can divide n-dimensional space into regions using discriminant functions.

maximum-likelihood method

Factor-analysis methods {maximum-likelihood method} can find factors by maximizing the likelihood of the observed data under the factor model.

multidimensional scaling

Metric or non-metric methods {multidimensional scaling} can analyze similarity or dissimilarity matrices to find the number of dimensions and place objects in proper relative positions.
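
A scikit-learn sketch of metric MDS on a precomputed dissimilarity matrix (built from random placeholder objects), recovering relative positions in two dimensions.

import numpy as np
from scipy.spatial.distance import pdist, squareform
from sklearn.manifold import MDS

rng = np.random.default_rng(0)
objects = rng.normal(size=(10, 5))            # placeholder objects
D = squareform(pdist(objects))                # 10 x 10 dissimilarity matrix

# Metric MDS: place objects in 2-D so distances match D as closely as possible.
mds = MDS(n_components=2, dissimilarity="precomputed", random_state=0)
coords = mds.fit_transform(D)
print(coords.shape)                           # (10, 2)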

multivariate adaptive regression spline

Non-parametric regression methods {multivariate adaptive regression spline} can fit adaptive piecewise-linear spline basis functions.

Mutation and Selection Uncover Models

Models {Mutation and Selection Uncover Models} can add and subtract randomly selected variables, with no crossing-over, and evaluate for "fitness" or best fit. Low mutation rates allow natural selection to operate on populations to move toward fitter genotypes. Intermediate mutation rates cause populations to move both toward and away from fitter genotypes. High mutation rates make many genotypes with no consistent direction, so high mutation blocks selection processes.

For any mutation rate, if gene number is too great, the total change rate is too great, and the organism becomes extinct {error catastrophe, extinction}. Therefore, gene number has a limit if organisms do not make new species or find new environments.

Perhaps cells and ecosystems also have upper limits to complexity. Complexity can increase with migration or speciation.

non-linear iterative partial least squares

Unsupervised linear methods {non-linear iterative partial least squares} can represent data as the product of a score matrix, for the original observations, and a transposed loading matrix, for the original factors.
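
A minimal NIPALS iteration in NumPy for one component, following the score-times-loading decomposition described above; the data and convergence tolerance are assumptions.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(20, 5))
X -= X.mean(axis=0)                 # center the columns

t = X[:, 0].copy()                  # initialize the score with one column
for _ in range(100):
    p = X.T @ t / (t @ t)           # estimate the loading from the score
    p /= np.linalg.norm(p)          # normalize the loading
    t_new = X @ p                   # estimate the score from the loading
    if np.linalg.norm(t_new - t) < 1e-10:
        t = t_new
        break
    t = t_new

residual = X - np.outer(t, p)       # deflate: X ~ t p' + residual
print(t.shape, p.shape)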

non-linear mapping

Topological-mapping factor-analysis methods {non-linear mapping} combine variables to make two or three new variables while preserving inter-object distances.

predictive computational model

Property information {predictive computational model} can predict behavior.

principal-component analysis

Principal components {principal-component analysis} are linear combinations of descriptors. This unsupervised linear method represents data as the product of a score matrix, for the original observations, and a transposed loading matrix, for the original factors. PCA uses linear variable combinations to make two or three new variables and reduces the influence of unimportant variables.
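
A NumPy sketch of PCA as scores times transposed loadings, reducing placeholder data to two new variables.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(30, 6))
X -= X.mean(axis=0)                 # center the columns

U, s, Vt = np.linalg.svd(X, full_matrices=False)
scores = U[:, :2] * s[:2]           # score matrix T (observations)
loadings = Vt[:2].T                 # loading matrix P (factors)

X_approx = scores @ loadings.T      # X ~ T P', keeping two components
print(np.linalg.norm(X - X_approx)) # residual from discarded components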

principal-component regression

Regression methods {principal-component regression} can use singular-value decomposition to find component scores and then regress responses on those scores, a method related to projection to latent structures (PLS).
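
A NumPy sketch of principal-component regression: singular-value decomposition gives component scores, and the response is regressed on the leading scores; the data and component count are placeholders.

import numpy as np

rng = np.random.default_rng(0)
X = rng.normal(size=(40, 6))
X -= X.mean(axis=0)
y = X[:, 0] + 0.5 * X[:, 1] + rng.normal(0, 0.1, 40)   # placeholder response

U, s, Vt = np.linalg.svd(X, full_matrices=False)
T = U[:, :3] * s[:3]                        # scores of the first 3 components

b, *_ = np.linalg.lstsq(T, y, rcond=None)   # regress the response on the scores
coef = Vt[:3].T @ b                         # map back to the original variables
print(coef)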

principal factor analysis

Modified PCA {principal factor analysis} can find principal factors, using communality estimates in place of the correlation-matrix diagonal.

Procrustes analysis

Methods {Procrustes analysis} can rotate, scale, and translate one configuration to best match another, identifying similar descriptor sets.

QR algorithm

Methods {QR algorithm} can diagonalize matrices by iterating QR decompositions to converge on the eigenvalues.

rank annihilation

Unsupervised linear methods {rank annihilation} can find factors by removing a known component's contribution from the data matrix until its rank drops.

response-surface method

Three-level designs {response-surface method} can have three factors and quantify relationships between responses and factors. RSM includes linear designs (MLR, OLS, PCR, and PLS), non-linear regression analysis, and non-parametric methods such as ACE, NPLS, and MARS.

Scree-plot

As component number increases, residual variance approaches constancy and the plotted slope levels off {Scree-test, vision}; the leveling point indicates how many components to keep.
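
A matplotlib sketch of a scree plot on placeholder data; the point where the curve levels off suggests the component count.

import numpy as np
import matplotlib.pyplot as plt

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 8))
eigenvalues = np.linalg.eigvalsh(np.corrcoef(X, rowvar=False))[::-1]

# Scree plot: eigenvalue versus component number; look for the elbow.
plt.plot(range(1, 9), eigenvalues, "o-")
plt.xlabel("component number")
plt.ylabel("eigenvalue")
plt.show()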

singular-value decomposition

In unsupervised linear methods {singular-value decomposition}, the correlation matrix becomes the product of score, eigenvalue, and loading matrices, with diagonalization using the QR algorithm.

spectral-mapping analysis

Factor-analysis methods {spectral-mapping analysis} can first take data logarithms, to damp outliers, and then subtract means from rows and columns, leaving only the variation and showing which variables are important and by how much.
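
A NumPy sketch of this log-then-double-center preprocessing on strictly positive placeholder data.

import numpy as np

rng = np.random.default_rng(0)
X = rng.lognormal(size=(12, 5))         # positive placeholder data

L = np.log(X)                           # logarithms damp large values
L -= L.mean(axis=1, keepdims=True)      # subtract row means
L -= L.mean(axis=0, keepdims=True)      # subtract column means

# L now holds only the variation; factor it to see which variables matter.
U, s, Vt = np.linalg.svd(L, full_matrices=False)
print(s[:3])                            # sizes of the leading factors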

structure space

Spaces {structure space} can have two or three principal components as axes, positioning compounds by structural similarity.

target-transformation factor analysis

Methods {target-transformation factor analysis} can rotate factors to match known target patterns, such as hypotheses or signatures.

Unsupervised Method

Factors and response variables can relate {Unsupervised Method} without using prior class information or predetermined models.

eight point algorithm

Methods {eight point algorithm} can find structure from motion by estimating the fundamental or essential matrix from eight point correspondences between two views.
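
An OpenCV sketch of the eight-point algorithm, assuming cv2 and eight matched points between two views; the correspondences here are random placeholders, so the recovered matrix is illustrative only.

import numpy as np
import cv2

rng = np.random.default_rng(0)
pts1 = rng.uniform(0, 640, (8, 2)).astype(np.float32)   # placeholder matches, view 1
pts2 = rng.uniform(0, 640, (8, 2)).astype(np.float32)   # placeholder matches, view 2

# Estimate the fundamental matrix from exactly eight correspondences.
F, mask = cv2.findFundamentalMat(pts1, pts2, cv2.FM_8POINT)
print(F)                                # 3 x 3 fundamental matrix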

facial expression recognition

People can recognize six basic facial expressions {facial expression recognition}: anger, disgust, fear, happiness, sadness, and surprise. Expressions have unique muscle activities {Facial Action Coding System}, grouped into Action Units. Methods detect faces, extract features, and classify expressions. Classifying can use Gabor filters, Bézier volumes, Bayes and Bayesian-network classifiers, and Hidden Markov Models.
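
A sketch of one feature-extraction stage, assuming OpenCV: a small Gabor-filter bank applied to a placeholder face crop; kernel parameters are illustrative assumptions.

import numpy as np
import cv2

rng = np.random.default_rng(0)
face = rng.uniform(0, 255, (64, 64)).astype(np.float32)   # placeholder face crop

# Gabor filters at four orientations, pooled into expression features.
features = []
for theta in np.arange(0, np.pi, np.pi / 4):
    kernel = cv2.getGaborKernel((21, 21), 4.0, theta, 10.0, 0.5)
    response = cv2.filter2D(face, -1, kernel)
    features.append(response.mean())     # pooled response per orientation

print(features)                          # input for a Bayes or HMM classifier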

Kalman filter

Gaussian filtering {Kalman filter} can use mean and variance parameters for normal distributions, weighting feature or pixel updates by a gain. Kalman filters are parametric, as opposed to particle filters, and predict output from input.
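
A one-dimensional Kalman-filter sketch in NumPy, tracking a constant value through noisy measurements; the noise variances are illustrative assumptions.

import numpy as np

rng = np.random.default_rng(0)
measurements = 5.0 + rng.normal(0, 1.0, 50)   # noisy observations of 5.0

x, P = 0.0, 1.0      # state mean and variance (the Gaussian parameters)
Q, R = 1e-4, 1.0     # assumed process and measurement noise variances

for z in measurements:
    P += Q                      # predict: variance grows by process noise
    K = P / (P + R)             # Kalman gain: weight given to the measurement
    x += K * (z - x)            # update the mean toward the measurement
    P *= 1 - K                  # update (shrink) the variance

print(x)                        # estimate near 5.0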

local image analysis

The first computer-vision stage is to find features, including invariants {local image analysis}. Invariants can be angles, local phase, and orientation.

particle filtering

Distributions can have representations as finite numbers of samples {particle, sample} defined on Markov chains {particle filtering}, rather than using parameters.
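
A minimal particle-filter sketch in NumPy for the same constant-value problem as the Kalman sketch above, representing the distribution as weighted samples instead of Gaussian parameters.

import numpy as np

rng = np.random.default_rng(0)
measurements = 5.0 + rng.normal(0, 1.0, 50)

N = 1000
particles = rng.uniform(0, 10, N)        # samples representing the distribution

for z in measurements:
    particles += rng.normal(0, 0.05, N)  # propagate along the Markov chain
    weights = np.exp(-0.5 * (z - particles) ** 2)   # measurement likelihood
    weights /= weights.sum()
    particles = particles[rng.choice(N, N, p=weights)]   # resample by weight

print(particles.mean())                  # estimate near 5.0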

Sobel edge operator

Operators {Sobel edge operator} can detect edges by convolving images with horizontal and vertical gradient kernels.
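
A sketch of the Sobel operator as direct convolution with its horizontal and vertical gradient kernels, assuming scipy is available; the toy image is a vertical edge.

import numpy as np
from scipy.signal import convolve2d

Kx = np.array([[-1, 0, 1], [-2, 0, 2], [-1, 0, 1]])   # horizontal gradient
Ky = Kx.T                                             # vertical gradient

img = np.zeros((8, 8))
img[:, 4:] = 1.0                        # toy image with a vertical edge

gx = convolve2d(img, Kx, mode="same")
gy = convolve2d(img, Ky, mode="same")
magnitude = np.hypot(gx, gy)            # edge strength per pixel
print(magnitude[4])                     # strong response at the edge columns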

support vector machine

Methods {support-vector machine} can detect shapes from image segmentation, using color, shape, and distance features.
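
A scikit-learn sketch of an SVM classifying segmented regions by color, shape, and distance features; the feature values are placeholder assumptions.

import numpy as np
from sklearn.svm import SVC

rng = np.random.default_rng(0)
# Placeholder features per region: [mean color, compactness, distance].
X = np.vstack([rng.normal(0, 1, (40, 3)), rng.normal(3, 1, (40, 3))])
y = np.array([0] * 40 + [1] * 40)       # shape class per region

svm = SVC(kernel="rbf").fit(X, y)       # maximum-margin classifier
print(svm.score(X, y))                  # training accuracy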
